-
In this article we document the current analysis software training and onboarding activities in several High Energy Physics (HEP) experiments: ATLAS, CMS, LHCb, Belle II and DUNE. Fast and efficient onboarding of new collaboration members is increasingly important for HEP experiments. With rapidly increasing data volumes and larger collaborations, the analyses, and consequently the related software, become ever more complex, which necessitates structured onboarding and training. Recognizing this, the HEP Software Foundation (HSF) held a meeting series in 2022 for experiments to showcase their initiatives. Here we document and analyze these initiatives in an attempt to distill a set of key considerations for future HEP experiments.
-
De Vita, R.; Espinal, X.; Laycock, P.; Shadura, O. (Eds.) Providing computing training to the next generation of physicists is the principal driver for a biannual multi-day training workshop hosted by the DUNE Computing Consortium. Materials are cast in the Software Carpentry template, and topics have included storage space, data management, LArSoft, and grid job submission and monitoring. Moreover, experts provide extended breakout sessions to demonstrate the fundamentals of the unique software used in HEP analysis. Each session uses live documents for real-time correspondence and is captured on Zoom; afterwards, the videos are embedded on the corresponding web pages for review. As a GitHub repository, the learning modules support straightforward shared editing and provide a trusted framework that can be extended to other training topics in the future. An overview of the tutorials and the machinery behind them is presented, along with survey statistics and lessons learned.
-
Dolezal, Z. (Ed.) The MINERvA experiment has completed its physics run using the 6 GeV, on-axis NuMI ME beam at Fermilab. The experiment received a total of 12 × 10^20 protons on target in both neutrino- and antineutrino-mode running. This gives MINERvA a new level of statistics in neutrino interaction measurements, with the ability to measure multi-dimensional differential cross sections. In addition, to make the most of this jump in statistics, a new level of precision in flux prediction has been achieved. We present results from MINERvA's Medium Energy (ME) physics program, including the new kinematic regimes that are now accessible.
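For context on why flux precision matters here, the schematic below shows the standard form of a flux-integrated differential cross-section extraction as commonly used in the field; the notation (U_ij, eps_i, Phi, T) is illustrative and not taken from the MINERvA paper itself.

```latex
% Schematic flux-integrated differential cross section in bins of a
% kinematic variable x (illustrative notation, not from the paper):
%   N_j^data - events reconstructed in bin j
%   N_j^bkg  - estimated background in bin j
%   U_ij     - unfolding matrix from reconstructed bin j to true bin i
%   eps_i    - selection efficiency in true bin i
%   Phi      - integrated neutrino flux;  T - number of target nucleons
\[
\left(\frac{d\sigma}{dx}\right)_{\!i}
  = \frac{\sum_j U_{ij}\,\bigl(N_j^{\mathrm{data}} - N_j^{\mathrm{bkg}}\bigr)}
         {\epsilon_i \,\Phi\, T\, \Delta x_i}
\]
% A fractional uncertainty on Phi propagates one-to-one into the
% measured cross section, so once statistical errors shrink, the
% flux prediction quickly becomes the limiting systematic.
```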
-
Doglioni, C.; Kim, D.; Stewart, G. A.; Silvestris, L.; Jackson, P.; Kamleh, W. (Eds.) This paper is based on a talk given at Computing in High Energy Physics in Adelaide, South Australia, Australia, in November 2019. It is partially intended to explain the context of DUNE computing for computing specialists. The Deep Underground Neutrino Experiment (DUNE) collaboration consists of over 180 institutions from 33 countries. The experiment is in preparation now, with commissioning of the first 10 kt fiducial-volume liquid argon TPC expected over the period 2025-2028 and a long data-taking run with 4 modules expected from 2029 and beyond. An active prototyping program is already in place, with a short test-beam run of a 700 t, 15,360-channel single-phase readout prototype at the Neutrino Platform at CERN in late 2018 and tests of a similarly sized dual-phase detector scheduled for mid-2019. The 2018 test-beam run was a valuable live test of our computing model. The detector produced raw data at rates of up to 2 GB/s. These data were stored at full rate on tape at CERN and Fermilab and replicated at sites in the UK and the Czech Republic. In total, 1.2 PB of raw data from beam and cosmic triggers were produced and reconstructed during the six-week test-beam run. Baseline predictions for the full DUNE detector, starting in the late 2020s, are 30-60 PB of raw data per year. In contrast to traditional HEP computational problems, DUNE's liquid argon TPC data consist of simple but very large (many GB) 2D data objects which share many characteristics with astrophysical images. This presents opportunities to use advances in machine learning and pattern recognition as a frontier user of High Performance Computing facilities capable of massively parallel processing.
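As a sanity check on the rates quoted in this abstract, the short sketch below relates the 1.2 PB six-week test-beam total to an average rate and converts the 30-60 PB/year baseline into daily and sustained figures. Only the constants come from the abstract; everything else is back-of-the-envelope arithmetic, not DUNE software.

```python
# Back-of-the-envelope data-rate arithmetic using figures quoted in the
# abstract: 1.2 PB over a six-week test-beam run, a 2 GB/s peak rate,
# and a 30-60 PB/year baseline for the full detector.

PB = 1e15  # bytes per petabyte (decimal units, as storage is quoted)
GB = 1e9   # bytes per gigabyte

# Six-week ProtoDUNE test-beam run: average ingest rate.
testbeam_total = 1.2 * PB
testbeam_seconds = 6 * 7 * 24 * 3600
avg_rate = testbeam_total / testbeam_seconds  # bytes per second
print(f"Average test-beam rate: {avg_rate / GB:.2f} GB/s "
      f"(vs. the quoted 2 GB/s peak)")

# Full-detector baseline: 30-60 PB of raw data per year.
year_seconds = 365 * 24 * 3600
for annual_pb in (30, 60):
    daily_tb = annual_pb * PB / 365 / 1e12       # TB written per day
    sustained = annual_pb * PB / year_seconds    # bytes per second
    print(f"{annual_pb} PB/yr -> {daily_tb:.0f} TB/day, "
          f"{sustained / GB:.2f} GB/s sustained")
```

Run as-is, this shows the test beam averaged roughly a third of its peak rate, and that even the low end of the full-detector baseline implies about 1 GB/s sustained to tape, which is why the multi-site replication described above matters.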
